Shrinkage for categorical regressors

Authors

Abstract


Similar Articles

Additive Regression Splines With Irrelevant Categorical and Continuous Regressors

We consider the problem of estimating a relationship using semiparametric additive regression splines when there exist both continuous and categorical regressors, some of which are irrelevant but this is not known a priori. We show that choosing the spline degree, number of subintervals, and bandwidths via cross-validation can automatically remove irrelevant regressors, thereby delivering ‘auto...

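The estimator described above is not reproduced here; the sketch below only illustrates the general recipe of cross-validating spline hyperparameters with scikit-learn, with the categorical regressor one-hot encoded rather than kernel-smoothed. The column names, synthetic data, and ridge penalty are illustrative assumptions, not the authors' estimator.

import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, SplineTransformer
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "x1": rng.uniform(-2, 2, n),          # relevant continuous regressor
    "x2": rng.uniform(-2, 2, n),          # irrelevant continuous regressor
    "z": rng.choice(["a", "b", "c"], n),  # categorical regressor
})
y = np.sin(2 * X["x1"]) + (X["z"] == "b") + rng.normal(0, 0.3, n)

pre = ColumnTransformer([
    ("splines", SplineTransformer(), ["x1", "x2"]),
    ("dummies", OneHotEncoder(drop="first"), ["z"]),
])
model = Pipeline([("pre", pre), ("reg", Ridge())])

# Cross-validation picks the spline degree, the number of knots, and the
# ridge penalty; the basis coefficients of the irrelevant regressor are
# shrunk toward zero rather than removed exactly.
grid = GridSearchCV(model, {
    "pre__splines__degree": [1, 2, 3],
    "pre__splines__n_knots": [4, 8, 12],
    "reg__alpha": [0.01, 0.1, 1.0],
}, cv=5)
grid.fit(X, y)
print(grid.best_params_)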

Scale-Space Based Weak Regressors for Boosting

Boosting is a simple yet powerful modeling technique that is used in many machine learning and data mining related applications. In this paper, we propose a novel scale-space based boosting framework which applies scale-space theory for choosing the optimal regressors during the various iterations of the boosting algorithm. In other words, the data is considered at different resolutions for eac...

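As a rough illustration of the idea, and not the authors' algorithm, the sketch below runs a boosting loop in which every iteration fits candidate weak regressors at several scales (here Nadaraya-Watson smoothers with different bandwidths, an assumed choice) and keeps the one that best fits the current residuals.

import numpy as np

def kernel_smoother(x_train, r_train, bandwidth):
    """Return a Nadaraya-Watson smoother of the residuals r_train."""
    def predict(x):
        w = np.exp(-0.5 * ((x[:, None] - x_train[None, :]) / bandwidth) ** 2)
        return (w @ r_train) / np.clip(w.sum(axis=1), 1e-12, None)
    return predict

def scale_space_boost(x, y, bandwidths=(1.0, 0.3, 0.1), n_iter=20, lr=0.5):
    pred = np.full_like(y, y.mean(), dtype=float)
    learners = []
    for _ in range(n_iter):
        resid = y - pred
        # Fit one candidate weak regressor per scale and keep the one with
        # the smallest residual sum of squares at this iteration.
        candidates = [kernel_smoother(x, resid, h) for h in bandwidths]
        best = min(candidates, key=lambda f: np.sum((resid - f(x)) ** 2))
        pred += lr * best(x)
        learners.append(best)
    return pred, learners

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(8 * x) + rng.normal(0, 0.2, 200)
fit, _ = scale_space_boost(x, y)
print(np.mean((y - fit) ** 2))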

Jointly Optimized Regressors for Image Super-resolution

Learning regressors from low-resolution patches to high-resolution patches has shown promising results for image super-resolution. We observe that some regressors are better at dealing with certain cases, and others with different cases. In this paper, we jointly learn a collection of regressors, which collectively yield the smallest super-resolving error for all training data. After training, e...

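A minimal sketch of the general scheme, not the authors' method: alternate between assigning each training pair to the regressor that currently super-resolves it best and refitting each regressor on its assigned pairs. The patch dimensions, ridge term, and number of regressors K are illustrative assumptions, and the random arrays merely stand in for vectorized low-res/high-res patch pairs.

import numpy as np

def fit_joint_regressors(X_lo, X_hi, K=4, n_iter=10, ridge=1e-3):
    n, d_lo = X_lo.shape
    d_hi = X_hi.shape[1]
    rng = np.random.default_rng(0)
    assign = rng.integers(0, K, n)          # random initial assignment
    W = np.zeros((K, d_lo, d_hi))
    for _ in range(n_iter):
        # (i) refit each regressor on its currently assigned pairs
        for k in range(K):
            idx = np.where(assign == k)[0]
            if idx.size == 0:
                continue
            A = X_lo[idx].T @ X_lo[idx] + ridge * np.eye(d_lo)
            W[k] = np.linalg.solve(A, X_lo[idx].T @ X_hi[idx])
        # (ii) reassign each pair to the regressor with the smallest error
        errs = np.stack([np.sum((X_lo @ W[k] - X_hi) ** 2, axis=1)
                         for k in range(K)], axis=1)
        assign = errs.argmin(axis=1)
    return W, assign

rng = np.random.default_rng(2)
X_lo = rng.normal(size=(300, 9))    # e.g. vectorized 3x3 low-res patches
X_hi = rng.normal(size=(300, 36))   # e.g. vectorized 6x6 high-res patches
W, assign = fit_joint_regressors(X_lo, X_hi)
print(W.shape, np.bincount(assign, minlength=4))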

Calibrated Prediction Intervals for Neural Network Regressors

Ongoing developments in neural network models are continually advancing the state-of-the-art in terms of system accuracy. However, the predicted labels should not be regarded as the only core output; also important is a well-calibrated estimate of the prediction uncertainty. Such estimates and their calibration are critical in relation to robust handling of out-of-distribution events not observe...

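One generic way to obtain calibrated intervals, not necessarily the method proposed in this paper, is split-conformal calibration: use the (1 - alpha) quantile of absolute residuals on held-out data as the interval half-width around a network's point predictions. A minimal sketch with a small scikit-learn network:

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.2, 2000)

# Split off a calibration set that the network never trains on.
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)

# The (1 - alpha) quantile of absolute calibration residuals gives an
# interval half-width with approximately valid marginal coverage.
alpha = 0.1
q = np.quantile(np.abs(y_cal - net.predict(X_cal)), 1 - alpha)

X_new = np.array([[0.0], [1.5]])
pred = net.predict(X_new)
print([(p - q, p + q) for p in pred])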

Subspace Information Criterion for Sparse Regressors

Non-quadratic regularizers, in particular the ℓ1-norm regularizer, can yield sparse solutions that generalize well. In this work we propose the Generalized Subspace Information Criterion (GSIC), which allows one to predict the generalization error for this useful family of regularizers. We show that under some technical assumptions GSIC is an asymptotically unbiased estimator of the generalization er...

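The sketch below only sets up the setting the abstract describes, an ℓ1-regularized (lasso) regressor whose solution is sparse; it does not implement the proposed GSIC, and the regularization strength is chosen by ordinary cross-validation instead. The data-generating process is an illustrative assumption.

import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 50))
beta = np.zeros(50)
beta[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]   # only 5 relevant coefficients
y = X @ beta + rng.normal(0, 0.5, 200)

# The ℓ1 penalty drives most estimated coefficients exactly to zero.
lasso = LassoCV(cv=5).fit(X, y)
print("non-zero coefficients:", np.sum(lasso.coef_ != 0))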


Journal

Journal title: Journal of Econometrics

Year: 2021

ISSN: 0304-4076

DOI: 10.1016/j.jeconom.2020.07.051